60 research outputs found

    Multi-Hazard Performance Assessment of High-Rise Buildings

    Get PDF
    In recent decades, the construction of high-rise buildings has increased significantly in many cities around the world. Because tall buildings offer benefits in densely populated areas in terms of space efficiency, economy, and sustainability, practitioners and researchers have sought to better understand their distinctive behavior and response to natural hazards (e.g., hurricanes, earthquakes). Owing to their flexibility and their typically low damping, high-rise buildings are more susceptible to wind and earthquake actions than low- and mid-rise buildings. Moreover, many locations are prone to multiple hazards; it is therefore important to thoroughly understand the structural behavior under each hazard separately in order to obtain better designs. In this study, the general methodology of performance-based loss assessment is applied to a hypothetical 74-story office building located in Miami, FL, and in New Madrid, MO. Seismic, wind, and hurricane hazards are considered. The expected losses related to the seismic hazard are evaluated following the Performance-Based Earthquake Engineering (PBEE) framework proposed by the Pacific Earthquake Engineering Research (PEER) Center, whereas the Performance-Based Wind Engineering (PBWE) and Performance-Based Hurricane Engineering (PBHE) frameworks are used to calculate the losses corresponding to wind- and hurricane-induced actions on the same building. The monetary losses considered include those due to damage to structural and non-structural components, as well as those due to occupant discomfort. The results from the two analyses are compared to each other to form a consistent foundation for future investigations of appropriate mitigation techniques (e.g., using dampers) to minimize the total expected losses for the considered building when both hazards are taken into account. This research is a first step toward a general approach to multi-hazard performance-based engineering and uniform-risk design for multiple hazards.
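
    For context, the loss-assessment frameworks mentioned above (PBEE, PBWE, PBHE) share the same conditioning logic, commonly summarized by the PEER framing integral below. This is a generic textbook form rather than the specific formulation used in this study, and the symbols are the usual illustrative ones.

        \nu(DV) \;=\; \iiint G(DV \mid DM)\,\bigl|\mathrm{d}G(DM \mid EDP)\bigr|\,\bigl|\mathrm{d}G(EDP \mid IM)\bigr|\,\bigl|\mathrm{d}\nu(IM)\bigr|

    Here IM is the hazard intensity measure (e.g., spectral acceleration or mean wind speed), EDP an engineering demand parameter, DM a damage measure, DV a decision variable such as repair cost or downtime, G(x|y) the complementary distribution of x given y, and \nu the mean annual rate of exceedance.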

    Improving lifecycle query in integrated toolchains using linked data and MQTT-based data warehousing

    Full text link
    The development of increasingly complex IoT systems requires large engineering environments. These environments generally consist of tools from different vendors that are not necessarily well integrated with each other. In order to automate various analyses, queries across resources from multiple tools have to be executed in parallel with the engineering activities. In this paper, we identify the necessary requirements on such a query capability and evaluate different architectures against these requirements. We propose an improved lifecycle query architecture that builds upon the existing Tracked Resource Set (TRS) protocol and complements it with the MQTT messaging protocol, allowing the data in the warehouse to be kept up to date in real time. As part of a case study focusing on the development of an IoT automated warehouse, this architecture was implemented for a toolchain integrated using RESTful microservices and linked data.
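
    To make the proposed combination of TRS and MQTT concrete, the sketch below shows one plausible shape of the warehouse-update path: an MQTT subscriber receives TRS-style change notifications and refreshes the affected resources in a local RDF store. It is a minimal illustration under assumed names; the broker address, topic, and JSON payload layout are invented and do not come from the paper.

        import json

        import paho.mqtt.client as mqtt   # MQTT client (paho-mqtt)
        import requests                   # to dereference changed resources
        from rdflib import Graph, URIRef  # lightweight stand-in for the warehouse store

        warehouse = Graph()               # in-memory RDF graph standing in for the data warehouse

        def on_message(client, userdata, msg):
            # Assumed payload: JSON naming the changed resource and the TRS change type
            # (Creation / Modification / Deletion).
            event = json.loads(msg.payload)
            uri, kind = event["changed"], event["type"]

            # Drop the stale description of the resource from the warehouse.
            warehouse.remove((URIRef(uri), None, None))

            # For creations and modifications, dereference the resource and store its new state.
            if kind != "Deletion":
                resp = requests.get(uri, headers={"Accept": "text/turtle"}, timeout=10)
                resp.raise_for_status()
                warehouse.parse(data=resp.text, format="turtle")

        client = mqtt.Client()            # paho-mqtt 1.x constructor; 2.x also requires a CallbackAPIVersion
        client.on_message = on_message
        client.connect("broker.example.org", 1883)  # hypothetical broker
        client.subscribe("trs/changes")             # hypothetical topic carrying change notifications
        client.loop_forever()

    In such a shape, the TRS base and change log would still be available to rebuild or audit the warehouse, while MQTT carries only the near-real-time notifications.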

    Linked Data Architecture for Plan Execution in Distributed CPS

    Get PDF
    Future cyber-physical systems (CPS) require their components to act autonomously. To do so safely and efficiently, CPS components need access to the global state of the whole CPS, and they require near real-time updates to a subset of that state in order to react to changes in the environment. A particular challenge is monitoring state updates from the distributed CPS components: one needs to ensure that only states consistent with the PDDL plan-execution semantics can be observed within the system. To guarantee this, a plan-execution monitoring component is proposed. Microservices based on Linked Data technologies provide a uniform way to access component states, represented as Resource Description Framework (RDF) resources. To ensure the correct ordering of state updates, we present an extension of the OASIS OSLC TRS protocol. Specifically, we strengthen the ordering guarantees of state-change events and introduce inlining of the state with the events to prevent state mismatches at the dereferencing stage.
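
    The sketch below illustrates the two ideas at the end of the abstract: consuming change events strictly in sequence order and using state inlined with each event instead of dereferencing the resource afterwards. The field names (a sequence number and an inlined Turtle payload) are placeholders chosen for illustration, not the paper's actual TRS-extension vocabulary.

        from rdflib import Graph

        class OrderedStateConsumer:
            """Applies change events strictly in sequence order, merging the state
            inlined with each event into the observed global state."""

            def __init__(self):
                self.next_expected = 1   # next sequence number that may be applied
                self.pending = {}        # out-of-order events buffered by sequence number
                self.state = Graph()     # observed global state as an RDF graph

            def on_event(self, order: int, inlined_state_ttl: str):
                self.pending[order] = inlined_state_ttl
                # Apply every buffered event whose turn has come, in order.
                while self.next_expected in self.pending:
                    ttl = self.pending.pop(self.next_expected)
                    self.state.parse(data=ttl, format="turtle")  # merge the inlined state
                    self.next_expected += 1

    A real monitor would additionally replace (rather than merely add) the triples describing the changed resource and check each update against the PDDL plan-execution semantics; both steps are omitted here.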

    Efficient State Update Exchange in a CPS Environment for Linked Data-based Digital Twins

    Get PDF
    This paper addresses the problem of reducing the number of messages needed to exchange state updates between Cyber-Physical System (CPS) components that integrate with the rest of the CPS through Digital Twins, so that they maintain a uniform communication interface while carrying out their tasks correctly and safely. The main contribution is a proposed architecture and a discussion of its suitability to support the correct execution of complex tasks across the CPS. A new State Event Filtering component is presented to provide event-based communication among Digital Twins based on the Linked Data principles, while keeping the fan-out limited to ensure the scalability of the architecture.
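
    As a rough illustration of the fan-out limiting idea, the sketch below forwards a state-change event only to the Digital Twins that registered interest in the affected resource, rather than broadcasting to every twin. The interfaces and names are invented for illustration and are not the paper's actual component.

        from collections import defaultdict
        from typing import Callable

        class StateEventFilter:
            def __init__(self):
                # resource URI -> delivery callbacks of the twins interested in that resource
                self.subscriptions = defaultdict(list)

            def subscribe(self, resource_uri: str, deliver: Callable[[str, dict], None]) -> None:
                self.subscriptions[resource_uri].append(deliver)

            def publish(self, resource_uri: str, event: dict) -> None:
                # Fan-out is bounded by the number of interested twins,
                # not by the total number of twins in the CPS.
                for deliver in self.subscriptions[resource_uri]:
                    deliver(resource_uri, event)

        # Example: only the conveyor twin is notified about conveyor state changes.
        event_filter = StateEventFilter()
        event_filter.subscribe("urn:twin:conveyor/state", lambda uri, ev: print("conveyor twin got", ev))
        event_filter.publish("urn:twin:conveyor/state", {"speed": 0.8})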

    Tools for Real-Time Control Systems Co-Design: A Survey

    Get PDF
    This report presents a survey of current simulation tools in the area of integrated control and real-time systems design. Each tool is presented with a quick overview followed by a more detailed section describing comparative aspects of the tool. These aspects cover the context and purpose of the tool (scenarios, development stages, activities, and qualities/constraints being addressed) and the actual tool technology (tool architecture, inputs, outputs, modeling content, extensibility, and availability). The tools presented in the survey are: Jitterbug and TrueTime from the Department of Automatic Control at Lund University, Sweden; AIDA and XILO from the Department of Machine Design at the Royal Institute of Technology, Sweden; Ptolemy II from the Department of Electrical Engineering and Computer Sciences at UC Berkeley, California; RTSIM from the RETIS Laboratory, Pisa, Italy; and SynDEx and Orccad from INRIA, France. The survey also briefly describes some existing commercial tools related to the area of real-time control systems.

    294 Implementation time of a lipid lowering therapy in patients with dyslipidemia: results of Prysme study

    Get PDF
    Despite the availability of specific guidelines, the management of dyslipidemia in practice is not optimal.
    Objective and methodology: PRYSME, a non-interventional multicentre study carried out with 1226 general practitioners, aimed to describe the implementation time of a lipid-lowering treatment according to cardiovascular risk level (primary objective) and to identify its determinants. Eligible patients were those treated for a dyslipidemia diagnosed less than 2 years earlier. Demographic and clinical characteristics, as well as the circumstances of diagnosis and treatment initiation, were collected.
    Results: 3268 patients were included (mean age: 57 years; males: 64%). 26% were obese and 45% were overweight. Only 12% had no cardiovascular risk factors (CRF) at the time of dyslipidemia diagnosis. The most frequent CRF were arterial hypertension (50%), smoking (43%), family history of premature coronary heart disease (28%), and HDL-c <0.4 g/l (20%), whereas 15% of the patients had a personal history of cardiovascular disease. Dietary programs were initially implemented for 98% of the patients. More than 90% were treated with a statin. The implementation time of the treatment (evaluated relative to the biological confirmation of dyslipidemia), according to the initial number of CRF, was as follows:

                        0 CRF    1 CRF    2 CRF    ≥ 3 CRF   Secondary prevention   Total
      [-3;0] months     34.3%    28.6%    27.1%    29.3%     49.1%                  33.1%
      ]0;3] months      23.1%    26.2%    26.4%    24.0%     21.9%                  23.9%
      > 3 months        42.6%    45.3%    46.5%    46.8%     29.0%                  43.0%
      Chi-square test: P<0.001

    The main determinant of an early implementation of a lipid-lowering therapy (≤ 3 months) was secondary prevention (OR=1.8). The number of CRF had no significant impact.
    Conclusion: This study underlines the lack of awareness of cardiovascular risk factors in the management of dyslipidemia, particularly with regard to the implementation time of a lipid-lowering therapy.

    An Analysis of the OASIS OSLC Integration Standard, for a Cross-disciplinary Integrated Development Environment: Analysis of market penetration, performance and prospects

    No full text
    OASIS OSLC is a standard that targets the integration of engineering software applications. Its approach promotes loose coupling, in which each application autonomously manages its own product data, while providing RESTful web services through which other applications can interact. This report aims to analyse the suitability of OSLC as an overarching integration mechanism for the complete set of engineering activities of Cyber Physical Systems (CPS) development. To achieve this, a review of the current state of the OASIS OSLC integration standard is provided in terms of its market penetration in commercial applications, its capabilities, and the architectural qualities of OSLC-based solutions. This review is based on a survey of commercial software applications that provide some support for OSLC capabilities.
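
    To illustrate the integration style the standard promotes (each tool exposing its product data as RDF resources over RESTful services), the sketch below fetches one hypothetical OSLC resource with RDF content negotiation and prints its titles. The URL is made up, and a real provider would normally also require authentication.

        import requests
        from rdflib import Graph
        from rdflib.namespace import DCTERMS

        # Hypothetical resource exposed by some tool's OSLC provider.
        resource_uri = "https://alm.example.org/oslc/changeRequests/42"

        resp = requests.get(
            resource_uri,
            headers={
                "Accept": "application/rdf+xml",  # request an RDF representation
                "OSLC-Core-Version": "2.0",       # standard OSLC request header
            },
            timeout=10,
        )
        resp.raise_for_status()

        g = Graph()
        g.parse(data=resp.text, format="xml")     # parse RDF/XML into an rdflib graph

        # Print the dcterms:title of every resource described in the response.
        for title in g.objects(predicate=DCTERMS.title):
            print(title)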

    A Model Management and Integration Platform for Mechatronics Product Development

    No full text
    Mechatronics development requires the close collaboration of various specialist teams and engineering disciplines. Developers from the different disciplines use domain-specific tools to specify and analyse the system of interest. This leads to different views of the system, each targeting a specific audience, using that audience's familiar language, and concentrating on that audience's concerns. Successful system development requires that the views produced by the different tools are well integrated into a whole, reducing the risk of inconsistencies and conflicts in the specified design information. This thesis discusses techniques for managing and integrating the views from the various disciplines, taking better advantage of multidisciplinary, model-based development. A Model Data Management (MDM) platform that generically manages models from the various domain-specific tools used in development is presented. The platform is viewed as a unification of the management functionalities typically provided by discipline-specific PDM and SCM systems, achieved by unifying the kind of objects it manages: models. View integration is considered an integral functionality of this platform. To demonstrate the platform's feasibility, a generic version-management functionality for models is implemented. In addition, model integration is investigated for the allocation of system functions onto the implementing hardware architecture. The proposed approach promotes the independent development of the views, allowing developers from each discipline to work concurrently, while ensuring the completeness, correctness, and analysis of any inter-view design decisions made. The prototype MDM platform builds on existing technologies from both the mechanical and software disciplines: it is based on a configurable PDM system, given its maturity and ability to manage model contents appropriately, while the version-control functionality borrows ideas from fine-grained version-control algorithms in the software discipline. The platform is argued to be feasible given the move toward model-based development in software engineering, which brings that discipline's needs closer to those of the hardware discipline. This paves the way for an easier and more effective integrated management platform satisfying the needs of both disciplines through a common set of mechanisms.
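
    To give a flavour of what fine-grained version management of models can mean in practice (versioning individual model elements rather than whole model files), here is a minimal sketch; the data structures and names are invented for illustration and do not reflect the thesis's actual MDM implementation.

        from dataclasses import dataclass, field

        @dataclass
        class ElementVersion:
            element_id: str     # stable identifier of the model element (e.g., a block or port)
            revision: int       # per-element revision counter
            properties: dict    # the element's attribute values at this revision

        @dataclass
        class ModelRepository:
            history: dict = field(default_factory=dict)   # element id -> revisions, oldest first

            def commit(self, element_id: str, properties: dict) -> ElementVersion:
                revisions = self.history.setdefault(element_id, [])
                # Only record a new revision if the element actually changed.
                if revisions and revisions[-1].properties == properties:
                    return revisions[-1]
                version = ElementVersion(element_id, len(revisions) + 1, dict(properties))
                revisions.append(version)
                return version

            def latest(self, element_id: str) -> ElementVersion:
                return self.history[element_id][-1]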